MACHINE LEARNING IN PYTHON
Machine Learning in Python shows you how to successfully analyze data using only two core machine learning algorithm families and how to apply them using Python. By focusing on two algorithm families that effectively predict outcomes, this book is able to provide full descriptions of the mechanisms at work and examples that illustrate the machinery with specific, hackable code. The algorithms are explained in simple terms with no complex math and applied using Python, with guidance on algorithm selection, data preparation, and using the trained models in practice.

Contents:

Introduction

Chapter 1 The Two Essential Algorithms for Making Predictions
·Why are these two algorithms so useful?
·What are penalized regression methods?
·What are ensemble methods?
·How to decide which algorithm to use
·The process steps for building a predictive model
·Framing a machine learning problem
·Feature extraction and feature engineering
·Determining performance of a trained model
·Chapter contents and dependencies

Chapter 2 Understand the Problem by Understanding the Data
·The anatomy of a new problem
·Different types of attributes and labels drive modeling choices
·Things to notice about your new data set
·Classification problems: detecting unexploded mines using sonar
·Physical characteristics of the rocks versus mines data set
·Statistical summaries of the rocks versus mines data set
·Visualization of outliers using a quantile-quantile plot
·Statistical characterization of categorical attributes
·How to use Python pandas to summarize the rocks versus mines data set
·Visualizing properties of the rocks versus mines data set
·Visualizing with parallel coordinates plots
·Visualizing interrelationships between attributes and labels
·Visualizing attribute and label correlations using a heat map
·Summarizing the process for understanding the rocks versus mines data set
·Real-valued predictions with factor variables: how old is your abalone?
·Parallel coordinates for regression problems: visualize variable relationships for the abalone problem
·How to use a correlation heat map for regression: visualize pair-wise correlations for the abalone problem
·Real-valued predictions using real-valued attributes: calculate how your wine tastes
·Multiclass classification problem: what type of glass is that?

Chapter 3 Predictive Model Building: Balancing Performance, Complexity, and Big Data
·The basic problem: understanding function approximation
·Working with training data
·Assessing performance of predictive models
·Factors driving algorithm choices and performance: complexity and data
·Contrast between a simple problem and a complex problem
·Contrast between a simple model and a complex model
·Factors driving predictive algorithm performance
·Choosing an algorithm: linear or nonlinear?
·Measuring the performance of predictive models
·Performance measures for different types of problems
·Simulating performance of deployed models
·Achieving harmony between model and data
·Choosing a model to balance problem complexity, model complexity, and data set size
·Using forward stepwise regression to control overfitting
·Evaluating and understanding your predictive model
·Control overfitting by penalizing regression coefficients: ridge regression

Chapter 4 Penalized Linear Regression
·Why penalized linear regression methods are so useful
·Extremely fast coefficient estimation
·Variable importance information
·Extremely fast evaluation when deployed
·Reliable performance
·Sparse solutions
·Problem may require linear model
·When to use ensemble methods
·Penalized linear regression: regulating linear regression for optimum performance
·Training linear models: minimizing errors and more
·Adding a coefficient penalty to the OLS formulation
·Other useful coefficient penalties: Manhattan and ElasticNet
·Why the lasso penalty leads to sparse coefficient vectors
·ElasticNet penalty includes both lasso and ridge
·Solving the penalized linear regression problem
·Understanding least angle regression and its relationship to forward stepwise regression
·How LARS generates hundreds of models of varying complexity
·Choosing the best model from the hundreds LARS generates
·Using glmnet: very fast and very general
·Comparison of the mechanics of the glmnet and LARS algorithms
·Initializing and iterating the glmnet algorithm
·Extensions to linear regression with numeric input
·Solving classification problems with penalized regression
·Working with classification problems having more than two outcomes
·Understanding basis expansion: using linear methods on nonlinear problems
·Incorporating non-numeric attributes into linear methods

Chapter 5 Building Predictive Models Using Penalized Linear Methods
·Python packages for penalized linear regression
·Multivariable regression: predicting wine taste
·Building and testing a model to predict wine taste
·Training on the whole data set before deployment
·Basis expansion: improving performance by creating new variables from old ones
·Binary classification: using penalized linear regression to detect unexploded mines
·Build a rocks versus mines classifier for deployment
·Multiclass classification: classifying crime scene glass samples

Chapter 6 Ensemble Methods
·Binary decision trees
·How a binary decision tree generates predictions
·How to train a binary decision tree
·Tree training equals split point selection
·How split point selection affects predictions
·Algorithm for selecting split points
·Multivariable tree training: which attribute to split?
·Recursive splitting for more tree depth
·Overfitting binary trees
·Measuring overfit with binary trees
·Balancing binary tree complexity for best performance
·Modifications for classification and categorical features
·Bootstrap aggregation: "bagging"
·How does the bagging algorithm work?
·Bagging performance: bias versus variance
·How bagging behaves on multivariable problems
·Bagging needs tree depth for performance
·Summary of bagging
·Gradient boosting
·Basic principle of the gradient boosting algorithm
·Parameter settings for gradient boosting
·How gradient boosting iterates toward a predictive model
·Getting the best performance from gradient boosting
·Gradient boosting on a multivariable problem
·Summary for gradient boosting
·Random forests
·Random forests: bagging plus random attribute subsets
·Random forests performance drivers
·Random forests summary

Chapter 7 Building Ensemble Models with Python
·Solving regression problems with Python ensemble packages
·Building a random forest model to predict wine taste
·Constructing a RandomForestRegressor object
·Modeling wine taste with RandomForestRegressor
·Visualizing the performance of a random forests regression model
·Using gradient boosting to predict wine taste
·Using the class constructor for GradientBoostingRegressor
·Using GradientBoostingRegressor to implement a regression model
·Assessing the performance of a gradient boosting model
·Coding bagging to predict wine taste
·Incorporating non-numeric attributes in Python ensemble models
·Coding the sex of abalone for input to random forest regression in Python
·Assessing performance and the importance of coded variables
·Coding the sex of abalone for gradient boosting regression in Python
·Assessing performance and the importance of coded variables with gradient boosting
·Solving binary classification problems with Python ensemble methods
·Detecting unexploded mines with Python random forest
·Constructing a random forests model to detect unexploded mines
·Determining the performance of a random forests classifier
·Detecting unexploded mines with Python gradient boosting
·Determining the performance of a gradient boosting classifier
·Solving multiclass classification problems with Python ensemble methods
·Classifying glass with random forests
·Dealing with class imbalances
·Classifying glass using gradient boosting
·Assessing the advantage of using random forest base learners with gradient boosting
·Comparing algorithms

Summary
Index
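Chapter 4's central claim, that the lasso penalty drives irrelevant coefficients to exactly zero while ridge only shrinks them, can be sketched with scikit-learn (a common package choice; the book's own code may use different packages) on synthetic data where only three of ten attributes matter:

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

# Synthetic regression data: 100 samples, 10 attributes,
# only the first 3 actually influence the label.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = X[:, 0] + 0.5 * X[:, 1] - 0.5 * X[:, 2] + rng.normal(scale=0.1, size=100)

# Ridge (squared penalty) shrinks every coefficient a little;
# lasso (absolute-value penalty) zeroes the irrelevant ones outright,
# which is why lasso yields sparse coefficient vectors.
ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)

ridge_nnz = int(np.sum(np.abs(ridge.coef_) > 1e-6))
lasso_nnz = int(np.sum(np.abs(lasso.coef_) > 1e-6))
print("ridge nonzero coefficients:", ridge_nnz)  # all 10 survive
print("lasso nonzero coefficients:", lasso_nnz)  # only the informative ones
```

The sparsity pattern itself serves as the variable importance information the chapter describes: attributes whose coefficients reach zero can be dropped from the deployed model.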
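Chapter 7's RandomForestRegressor and GradientBoostingRegressor constructors are the scikit-learn class names; a minimal regression sketch in the spirit of the wine-taste examples (the wine data itself is not reproduced here, so a hypothetical synthetic target stands in) looks like this:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.model_selection import train_test_split

# Hypothetical data: 5 attributes, only the first two drive the label.
rng = np.random.default_rng(1)
X = rng.uniform(size=(300, 5))
y = 10 * X[:, 0] + 5 * X[:, 1] ** 2 + rng.normal(scale=0.5, size=300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Random forest: bagging plus random attribute subsets.
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
# Gradient boosting: many shallow trees added sequentially at a small
# learning rate, each fit to the residual of the ensemble so far.
gbm = GradientBoostingRegressor(
    n_estimators=200, learning_rate=0.05, random_state=0
).fit(X_tr, y_tr)

rf_r2 = rf.score(X_te, y_te)
gbm_r2 = gbm.score(X_te, y_te)
print("random forest test R^2:", round(rf_r2, 3))
print("gradient boosting test R^2:", round(gbm_r2, 3))
print("RF feature importances:", np.round(rf.feature_importances_, 2))
```

Both models expose `feature_importances_`, which is how the chapter assesses the importance of coded variables after training.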
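"Coding the sex of abalone" refers to turning a categorical attribute into numeric columns before it can feed a tree ensemble or linear model. One standard way to do this with pandas (named in Chapter 2; the tiny data frame below is hypothetical, not the real abalone data) is one-hot coding:

```python
import pandas as pd

# Hypothetical miniature of the abalone data: 'sex' is categorical (M/F/I).
df = pd.DataFrame({
    "sex": ["M", "F", "I", "M"],
    "length": [0.45, 0.53, 0.33, 0.61],
})

# One-hot coding replaces 'sex' with one 0/1 indicator column per category,
# so ensemble and linear models can consume it as numeric input.
coded = pd.get_dummies(df, columns=["sex"])
print(list(coded.columns))  # → ['length', 'sex_F', 'sex_I', 'sex_M']
```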

Author: Michael Bowles
Publication: Wiley
ISBN: 9788126555925
Store book number: 107
NRS 960.00
  
 
 
 
 
 
All Rights Reserved © bookplus.com.np 2008